Feature Review

AI-Driven Phenotyping Platforms for Large-Scale Cotton Field Trials  

Xiaojing Yang, Xiaoyan Chen, Yuxin Zhu
Modern Agriculture Research Center, Cuixi Academy of Biotechnology, Zhuji, 311800, Zhejiang, China
Field Crop, 2025, Vol. 8, No. 3   
Received: 02 Apr., 2025    Accepted: 13 May, 2025    Published: 04 Jun., 2025
© 2025 BioPublisher Publishing Platform
This is an open access article published under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract

As an important economic crop, cotton has yield and quality that directly affect the development of the textile industry and the agricultural economy. This study summarizes the relationships between the main agronomic traits of cotton (such as plant height, boll number, and fiber quality) and yield and quality, and discusses the urgent need for phenotypic data in large-scale field trials as well as the value of phenotypic big data in cotton breeding and precision cultivation. At the technical level, it introduces the application of computer vision and deep learning in plant phenotypic identification, the role of machine learning methods in the prediction and classification of cotton traits, and automated approaches to multimodal data fusion and feature extraction. In terms of data processing and analysis, this study explores key technologies such as image segmentation and extraction of cotton plant structural parameters, time-series data analysis and growth dynamics monitoring, and correlation analysis between phenotypes, genotypes, and environmental factors. It also analyzes the practical application and performance of AI-driven cotton phenotyping platforms by drawing on large-scale trial cases in cotton-growing regions of China and the United States. Finally, this study reviews current challenges and outlines future development trends, aiming to provide references and inspiration for future cotton phenomics research, intelligent breeding, and smart agriculture.

Keywords
Cotton; High-throughput phenotype; Artificial intelligence; Unmanned aerial vehicle remote sensing; Field experiment

1 Introduction

Cotton (Gossypium spp.) is an important economic crop closely tied to the national economy and people's livelihood. Large-scale field trials are essential for screening superior varieties and optimizing cultivation practices. Every year, breeding units plant large numbers of cotton varieties in different ecological zones for comparative trials to assess key agronomic traits such as yield, fiber quality, and stress resistance. However, the expression of these traits is influenced by complex interactions between genotype and environment, and accurately obtaining field phenotypic data is the core link in understanding gene-phenotype relationships and guiding breeding decisions. Phenotypic analysis runs through all stages of breeding trials, from the evaluation of seedling vigor to the determination of yield and quality at maturity, and is an indispensable basis for selecting high-yield, high-quality, and stress-resistant varieties. Traditionally, researchers have relied on manual measurement to record traits such as plant height, the number of fruit branches, the number of bolls, and fiber quality. However, manual surveys are not only time-consuming and labor-intensive but also prone to subjective bias and environmental interference, making it difficult to capture the true differences among large sets of materials in a timely and comprehensive manner. As a result, phenotypic data acquisition has gradually become one of the bottlenecks restricting the genetic improvement of cotton (Beegum et al., 2024). As breeding enters the era of big data, developing efficient, objective, and accurate phenotyping technologies is of strategic significance. The emergence of high-throughput phenotyping (HTP) technology provides a solution to this problem: HTP rapidly monitors multiple traits related to crop growth, yield, and stress resistance in the field through non-destructive means.

 

For a long time, field phenotypic data for cotton have relied mainly on manual observation and simple instrument measurements, such as manually measuring plant height, counting bolls, and analyzing fiber quality in the laboratory. These traditional methods have obvious limitations: (1) low efficiency: manual measurement consumes large amounts of labor and material resources and cannot cover large experimental fields in a timely manner; (2) subjective bias: inconsistent standards among different investigators lead to poor data repeatability; (3) limited spatiotemporal resolution: plant growth dynamics cannot be monitored continuously, and data can only be obtained at a limited number of time points; (4) limited indicators: complex phenotypes such as canopy temperature and photosynthetic parameters are difficult to determine promptly by manual methods. With the development of information technology and artificial intelligence, emerging AI-driven phenotyping platforms are gradually overcoming these bottlenecks. Advances in computer vision and sensor technology make it possible to use equipment such as drones and robots to acquire massive amounts of crop growth images and environmental data in real time through cameras and various sensors. The rise of AI algorithms such as deep learning enables computers to automatically extract plant features from complex image data, achieving precise recognition and quantification of traits (Ampatzidis and Partel, 2020). Studies have shown that deep learning models such as convolutional neural networks perform exceptionally well in tasks like object detection and image segmentation and have been successfully applied in phenotypic analyses such as plant organ recognition and pest and disease detection. In recent years, a number of intelligent field phenotyping platforms have been developed in China and abroad, such as unmanned aerial vehicles equipped with multispectral cameras, high-throughput phenotyping tractor systems, and autonomous field phenotyping robots (Ye et al., 2023). These AI-driven platforms can efficiently and objectively obtain large amounts of trait data for crops at different growth stages in the field, thereby significantly increasing the data output of field trials.

 

This study systematically reviews the current status and development trends of AI-driven phenotyping in large-scale cotton field trials and explores its role and prospects in cotton genetic breeding and precision agriculture. It analyzes the technical basis of AI-driven phenotyping and summarizes the development of high-throughput phenotype acquisition platforms, comparing the characteristics and advantages of aerial platforms (unmanned aerial vehicles, satellite remote sensing) and ground-based platforms (modified tractor systems, field robots), and introducing the integrated application of multispectral, hyperspectral, thermal imaging, and other sensors in cotton phenotypic monitoring. With a focus on phenotypic data processing and analysis, this paper discusses the extraction of cotton plant structural parameters through image segmentation and 3D reconstruction, the monitoring of growth dynamics through time-series data analysis, and how correlation analyses between phenotypic data, genotypes, and environmental factors can reveal the genetic mechanisms of traits. This study also examines the practical application of AI phenotyping platforms through real cases, including large-scale phenotypic monitoring carried out in China's cotton-growing regions, the application of AI in field trials and yield prediction in U.S. cotton-growing regions, and the performance of, and challenges faced by, AI phenotyping platforms in multi-location trials. Through this review and analysis, we hope to provide a reference for related research and help AI phenotyping technology better serve the genetic improvement and production management of cotton.

 

2 Cotton Phenotypic Traits and Field Trial Requirements

2.1 Major agronomic traits of cotton and their relationship to yield and fiber quality

The agronomic traits of cotton plants are rich and diverse; the most important include the length of the growth period, plant height, number of fruit branches, number of bolls, single-boll weight, lint percentage (the ratio of lint to seed cotton), and fiber quality indicators (length, strength, micronaire value, etc.). These traits jointly determine the yield and quality performance of cotton. For instance, plant height and branching type affect population structure and light interception efficiency; the number of fruit branches and single-boll weight directly determine the total number of bolls per unit area and the yield per boll, and are key yield components; the lint percentage reflects the proportion of fiber produced; and the length, strength, and fineness (micronaire value) of the fibers determine the textile quality of cotton fiber (Li, 2024).

 

In breeding practice, it is often necessary to balance yield and quality, as the two sometimes show a negative correlation: high-yield varieties may have slightly inferior fiber quality, while varieties with excellent fiber often yield less. This makes a deep understanding of the relationships among the main agronomic traits particularly important. A large number of genetic and statistical studies have revealed correlation patterns among cotton traits. For instance, field trial analyses of multiple cotton varieties (lines) have shown that lint yield is significantly positively correlated with yield components such as the number of bolls per plant and single-boll weight, and is often positively correlated with plant height; taller plants usually produce more bolls and higher yields. Meanwhile, there are certain correlations between yield and some fiber quality traits. For instance, higher-yielding materials may have a moderately higher micronaire value (an indicator of fiber fineness) but shorter fiber length. Deng et al. (2020) analyzed trials of 63 new early-maturing upland cotton varieties and found that seed cotton yield was highly significantly positively correlated with growth period, plant height, boll number, and single-boll weight, and was also positively correlated with fiber micronaire value and uniformity index. This indicates that moderately extending the growth period and increasing plant height, boll number, and single-boll weight can simultaneously increase yield and, to a certain extent, improve fiber quality. On the other hand, environmental conditions regulate the relationships between traits. For instance, under conditions of sufficient water and fertilizer, increasing plant height and branch number is beneficial for yield, whereas under drought or high-density planting, overly tall plants may instead lodge and reduce yield.

2.2 Demand for phenotypic data acquisition in large-scale field trials

Large-scale field trials of cotton usually involve planting and comparing numerous varieties (lines) at multiple locations over multiple seasons to assess the yield potential, yield stability, and adaptability of the materials. Such trials are characterized by their very large scale: a single trial may involve hundreds of materials planted with replication in different environments. How to obtain detailed and reliable phenotypic data at such a scale is a major challenge facing researchers. First, in trials involving hundreds or even thousands of plots, relying on traditional manual measurement is clearly infeasible: the labor input is enormous, the work is difficult to complete in a timely manner, and data collected at different locations during the same period are hard to keep consistent (Adams et al., 2020). Yet breeders urgently need comprehensive phenotypic information to discover superior materials and identify the genes controlling traits. Large-scale trials therefore impose urgent requirements on efficient phenotypic data acquisition: (1) measurement methods that can cover a large area and many materials at once, ideally completing data collection for the whole trial within a short window to eliminate the influence of diurnal environmental variation; (2) objective and standardized measurements that ensure consistency across observers and locations; and (3) repeated measurements throughout the growth period to capture the dynamic changes of traits. Only when these requirements are met can the advantages of large-scale experimental design be fully exploited to screen genuinely outstanding genetic material from a vast pool and ensure the credibility and stability of the selection results. The emergence of high-throughput phenotyping technology precisely meets these demands. For instance, an unmanned aerial vehicle (UAV) remote sensing platform can capture canopy images and growth parameters of an entire experimental field in a single flight, enabling one person to acquire data from thousands of plots in a single day and greatly improving efficiency. Studies show that during the rapid growth stage of cotton, day-to-day changes in traits such as plant height are substantial and difficult to capture accurately by hand, whereas tools such as drones allow high-frequency monitoring and continuous growth curves. Ye et al. (2023) pointed out that during the rapid growth period of cotton, daily plant growth is so large that large-scale manual measurement is "almost unrealistic", whereas UAV remote sensing can accurately capture the plant height dynamics of different materials across the whole field.

 

2.3 Value of phenotypic big data in cotton breeding and precision cultivation

In terms of breeding, phenotypic big data provides a prerequisite for dissecting the genetic mechanisms of complex quantitative traits. Traditional QTL mapping and association analysis are often limited by the volume and accuracy of phenotypic data, whereas high-throughput platforms can provide larger numbers of more detailed phenotypic measurements, thereby enhancing the statistical power to detect genetic effects. For instance, by integrating big data from multi-location trials in different environments with genotyping data, major and minor QTLs can be mined more reliably and key functional genes identified. Zhao (2019) pointed out that by integrating automated platform equipment and information technology to obtain massive multi-scale, multi-habitat, multi-source heterogeneous plant phenotypic data, and thereby forming plant phenomics big data, the relationships among genotype, phenotype, and environment can be explored systematically and in depth from an omics perspective. This means that phenotypic big data will help us gain a more comprehensive understanding of the genetic architecture of important quantitative traits in cotton, such as yield and quality, providing clear gene targets for molecular breeding. In fact, recent breakthroughs in cotton genomics are closely related to the large-scale collection of phenotypic data. Using high-throughput field phenotypic screening combined with genome-wide association studies (GWAS), researchers have successfully identified major genes affecting traits such as plant height and drought resistance. For instance, Ye et al. (2023) conducted GWAS using multi-time-point plant height data from drones, located plant height related loci on chromosomes A01 and A11, and identified the candidate genes GhUBP15 and GhCUL1. These findings provide new molecular tools for breeding cotton varieties with ideal plant architecture, high yield, and stress resistance. Similarly, for fiber quality improvement, combining large-scale phenotypic assessments (such as fiber length and strength performance under different environments) also helps dissect the genetic basis of quality traits.

 

3 Technical Foundations of AI-Driven Phenotyping

3.1 Applications of computer vision and deep learning in plant phenotypic recognition

Computer vision (CV) is a technology that enables computers to "see" images and recognize objects, and it has brought about significant changes in the automatic identification of plant phenotypes. In the past, the analysis of plant images relied mainly on manually designed image processing methods, such as setting color thresholds to separate green plants from soil or using shape analysis to calculate leaf area. These traditional methods work quite well when the background is relatively simple and the target is clear, for instance when quickly obtaining plant counts and coverage area. However, in the field environment, where illumination varies greatly and weeds are abundant, these methods are prone to errors. In recent years, deep learning, especially convolutional neural networks (CNNs), has developed rapidly and brought new breakthroughs to computer vision. Deep learning models can learn from large numbers of images how to recognize objects without manual extraction of image features. In plant recognition, CNN models have been applied to seedling detection, leaf counting, and fruit recognition, with accuracy far higher than that of traditional methods (Zhang and Wang, 2024). For instance, Yang et al. (2025) used a deep learning model to identify cotton leaf diseases, achieving a classification accuracy of approximately 98%. For weed detection in cotton fields, deep convolutional networks can also distinguish cotton from weeds against complex backgrounds and automatically count and locate them. Deep learning can further be used to extract phenotypic features of cotton organs. For instance, Wu et al. (2022) used images captured by drones to generate 3D point clouds, from which structural information such as plant height and canopy leaf area index can be accurately extracted for cotton fields. Convolutional networks can also perform image segmentation, assigning each pixel in the image to parts such as leaves, stems, or bolls, and then calculating the size and shape of each organ.
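To make the idea of learned (rather than hand-crafted) image features concrete, the following is a minimal sketch of a small CNN classifier for cotton leaf images, assuming PyTorch is available. The network, class labels, and data are all hypothetical stand-ins; the studies cited above fine-tune far deeper models on labelled field images.

```python
# Minimal sketch (not the cited authors' model): a small CNN that classifies
# cotton leaf images into hypothetical classes (e.g. healthy / diseased / pest-damaged).
import torch
import torch.nn as nn

class SmallLeafCNN(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        # Two convolution blocks learn colour/texture features automatically,
        # replacing hand-crafted colour thresholds used in traditional pipelines.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Forward pass on a dummy batch of 4 RGB leaf images (128 x 128 pixels).
model = SmallLeafCNN()
logits = model(torch.randn(4, 3, 128, 128))
print(logits.shape)  # torch.Size([4, 3])
```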

 

3.2 Role of machine learning in cotton trait prediction and classification

In the era of agricultural big data, machine learning (ML) technology has become an important tool for extracting patterns from massive phenotypic data, predicting traits and classifying them. Compared with traditional statistical regression models, machine learning (especially deep learning) can handle high-dimensional nonlinear data relationships and performs well in predicting important traits of cotton. Yield forecasting is one of the most typical applications of machine learning in the phenotypic analysis of cotton. Early studies mostly adopted empirical regression models (such as multiple linear, stepwise regression, etc.) to predict per-unit yield based on indicators such as vegetation index during the growth period, but these models were difficult to fit complex nonlinear relationships. Nowadays, deep learning models that integrate multi-source data have significantly improved prediction accuracy.

 

AI is not limited to image classification; it has taken on substantial analytical work in agriculture, and cotton yield prediction is a good example. Where yield estimates once relied largely on experience, deep learning now removes much of that burden. Yang et al. (2025) noted that structures such as convolutional and pooling layers can automatically extract useful features from complex data without manual variable selection, and can jointly process data from different sources, whether images or sensor readings. This has been verified experimentally: when CNN models were combined with UAV imagery, the average error of plot-level yield prediction could be kept within 8%. Some teams have gone further, for example using an improved YOLOv8 model to detect cotton bolls and estimate yield from their counts, with an error of only about 7.7%. Feng et al. (2025) developed a yield estimation method combining multispectral remote sensing and machine learning and found that relying on a single sensor gave only moderate results, whereas integrating the visible, red-edge, and near-infrared bands immediately improved prediction accuracy. From this perspective, letting machine learning "piece the puzzle together" from multiple sources is considerably more reliable than traditional single-variable approaches. Beyond yield estimation, AI can also classify and "identify" cotton: tools such as support vector machines, random forests, and deep neural networks can be used to determine whether a variety is insect-resistant cotton or to identify growth stages such as the seedling, budding, and flowering stages.
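As a simple, hedged illustration of the workflow (not the cited methods), the sketch below predicts plot-level yield from a few mid-season remote-sensing features with a random forest; the feature names and the synthetic data are hypothetical.

```python
# Illustrative sketch only: plot-level yield prediction from UAV-derived features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
n_plots = 300
# Hypothetical per-plot features: NDVI at flowering, red-edge index, canopy height (cm).
X = np.column_stack([
    rng.uniform(0.5, 0.9, n_plots),   # NDVI
    rng.uniform(0.2, 0.6, n_plots),   # red-edge chlorophyll index
    rng.uniform(60, 120, n_plots),    # canopy height
])
# Synthetic yield with noise, only so the example runs end to end.
y = 2000 * X[:, 0] + 800 * X[:, 1] + 5 * X[:, 2] + rng.normal(0, 80, n_plots)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("MAPE:", mean_absolute_percentage_error(y_te, model.predict(X_te)))
```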

 

3.3 Multi-modal data fusion and automated feature extraction

The phenotype of cotton is jointly shaped by genes, environment, and crop management, so understanding it requires combining data from different sources. Multimodal data fusion means analyzing data of various origins and types together, for instance combining images, spectra, meteorological records, and soil information, so that the state of the cotton crop can be understood more comprehensively and accurately. In cotton phenotypic analysis, multimodal fusion has become an important way to improve model performance and uncover new problems. An optical image of the canopy alone is sometimes insufficient; for instance, it may not distinguish nitrogen deficiency from water deficiency, but adding thermal infrared images (which reveal leaf temperature) and soil moisture data makes it much easier to determine which problem is present. This "1+1>2" effect has been demonstrated in many studies. Wang et al. (2022) used Sentinel-2 satellite imagery combined with multi-date observations and meteorological information, which not only improved the accuracy of cotton yield prediction but also identified which growth period is most suitable for yield estimation.

 

Multimodal data fusion can overcome the limitations of a single data source and provide stronger explanatory power for complex agronomic traits. AI-driven platforms are naturally suited to multimodal fusion analysis. On the one hand, advances in sensor technology make it possible to acquire multimodal data simultaneously: for instance, UAV platforms can carry RGB cameras to capture visible light images, multispectral cameras to obtain vegetation indices, hyperspectral imagers to obtain fine spectral curves, and thermal imagers to obtain temperature distributions, thereby collecting multimodal phenotypes in a single flight. Ground-based phenotyping vehicles can also integrate LiDAR and imaging sensors to obtain three-dimensional structural and spectral information at the same time. On the other hand, machine learning, especially deep learning models, provides a powerful framework for multimodal data fusion. Convolutional neural networks can have multiple branches that handle different modalities separately and then fuse them at the level of high-level features, and models based on the attention mechanism can automatically learn the weights of each modality (Zhang et al., 2024). Multimodal fusion not only enhances prediction accuracy but also provides a new perspective for revealing the relationships between different data sources and traits.
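A conceptual sketch of the multi-branch fusion idea described above is given below, assuming PyTorch: one branch encodes an image modality (a canopy RGB patch) and a second branch encodes a tabular modality (e.g. weather or soil variables), with the two feature vectors concatenated before the prediction head. The architecture sizes are arbitrary and for illustration only.

```python
# Conceptual two-branch feature-level fusion network (illustrative sizes).
import torch
import torch.nn as nn

class FusionNet(nn.Module):
    def __init__(self, n_tabular: int = 8):
        super().__init__()
        # Branch 1: convolutional encoder for the image modality.
        self.img_branch = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Branch 2: small MLP for the tabular modality.
        self.tab_branch = nn.Sequential(nn.Linear(n_tabular, 32), nn.ReLU())
        # Fusion head combines the two 32-dimensional feature vectors.
        self.head = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, img, tab):
        fused = torch.cat([self.img_branch(img), self.tab_branch(tab)], dim=1)
        return self.head(fused)

net = FusionNet()
pred = net(torch.randn(2, 3, 64, 64), torch.randn(2, 8))  # dummy batch
print(pred.shape)  # torch.Size([2, 1])
```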

 

Automated feature extraction is an important characteristic of AI phenotypic analysis. Traditional analysis often relies on manual feature selection, such as choosing specific band ratios as vegetation indices or selecting several major traits based on experience for regression. Such manual feature engineering is time-consuming and may miss key information. Deep learning models can automatically learn multi-level features from raw data, for example automatically extracting the shape and texture features of disease spots or the spectral features related to chlorophyll content from leaf images. Automated feature extraction is especially advantageous in multimodal scenarios: the model can jointly consider the various data modalities and extract useful information at different scales. For instance, a CNN can simultaneously process canopy images of cotton at different heights, automatically capturing the growth characteristics of different growth stages through its convolutional layers, without the need to manually define which period or which index is most relevant.

 

4 High-Throughput Phenotyping Platforms and Sensor Integration

4.1 Advantages of UAVs and satellite remote sensing for large-scale cotton phenotyping

In recent years, unmanned aerial vehicles (UAVs) and satellite remote sensing have been widely applied as high-throughput phenotypic data acquisition methods in cotton research. The UAV remote sensing platform has outstanding advantages in mobility, high resolution, and ease of operation, and is particularly suitable for phenotypic monitoring of cotton in large fields. First, drones can fly at low altitude to obtain canopy images with centimeter-level resolution, clearly presenting details at the level of individual plants and even organs, such as leaf color, the number of flowers, and the degree of boll opening, far exceeding the resolution of satellite imagery. Studies show that UAV RGB imagery can accurately extract cotton plant height by reconstructing three-dimensional point clouds, with correlations to ground-measured height above 0.95. Psiroukis et al. (2023) used visible light images from UAVs to generate digital surface models for estimating cotton plant height, and the results agreed closely with manual measurements (R² > 0.90), verifying the accuracy and reliability of UAV measurements. Second, UAVs respond quickly and can flexibly adjust flight timing and frequency as needed, enabling multi-temporal dynamic monitoring of phenotypic traits. For instance, the experimental fields can be photographed weekly or even every few days to record cotton growth curves and key turning points in growth, which is difficult to achieve with traditional manual surveys. Third, the coverage of a UAV is moderate: a multi-rotor UAV can easily acquire data from dozens to hundreds of hectares of experimental fields in a single day, making it particularly suitable for large-scale trials at research stations or breeding bases.

 

In contrast, although satellite data cover a larger area, they are often limited by spatiotemporal resolution and affected by cloudy and rainy weather, whereas drones can take off and land at any time on clear days to obtain sharp images. Practice has shown that in key agronomic operations such as cotton defoliation and ripening, UAV monitoring plays an irreplaceable role. Ma et al. (2021) used drones equipped with RGB cameras to capture images of cotton fields before and after defoliation for mechanical harvesting and rapidly calculated the defoliation rate from vegetation indices, providing an efficient means of evaluating defoliant effectiveness. The model they established shortened the manual survey time from several days to just a few minutes, and the monitoring accuracy for defoliation rate reached over 90%, greatly improving the efficiency of trials related to mechanical cotton harvesting. Satellite remote sensing has unique advantages for large-area monitoring and long time-series data. Satellite platforms (such as the European Sentinel-2 and the US MODIS) can cover entire major cotton-producing regions and provide regional-scale information on vegetation growth and yield estimation. Although satellite images have relatively low resolution (typically 10 m to 30 m), making it difficult to resolve individual plants, their wide coverage and fixed revisit period make them suitable for macroscopic analysis.

 

4.2 Applications of ground-based phenotyping platforms (automated vehicle systems, robotics)

In addition to aerial platforms, ground-based high-throughput phenotyping platforms are a current research hotspot, offering advantages in precisely capturing plant details and collecting data around the clock. Ground platforms mainly include two types: modified high-clearance vehicles (phenotyping tractors) and autonomous field robots. The vehicle-mounted phenotyping system is usually a high-chassis tractor or tracked vehicle equipped with a variety of sensors, such as high-definition cameras, LiDAR, spectrometers, and environmental sensors. As the vehicle travels between crop rows, the sensors scan the plants on both sides at close range. Because the vehicle-mounted platform is closer to the plants, the resolution and accuracy of the data obtained are often better than those of aerial platforms. For instance, a high-clearance phenotyping vehicle developed in the United States is equipped with four adjustable robotic arms between cotton rows, each integrating RGB cameras and LiDAR, enabling it to capture the three-dimensional structure of the cotton canopy and plants from multiple angles at close range. This system can measure traits such as plant height, crown width, and leaf area index (LAI) throughout the entire growth period without touching the plants, and can move among tall stems without being limited by cotton growth stage or field size (Jiang et al., 2018). Vehicle-mounted phenotyping platforms have also been developed in China. For instance, the cotton phenotyping tractor developed by the Nanjing Institute of Agricultural Mechanization can simultaneously collect multispectral canopy images and ultrasonic ranging data, enabling automatic measurement of row height and density. This type of platform features an independent power supply and all-weather operation, making it suitable for regular field data collection tours.

 

The phenotyping robot is equipped with RTK differential GPS, LiDAR, and similar components to achieve precise positioning and obstacle-avoiding navigation. Because the robot gets closer to the plants, it can carry high-precision sensors such as microscopic imaging and close-range spectral probes to obtain information at the organ level. For instance, one cotton phenotyping robot developed abroad uses a mechanical arm to reach into the canopy, capture high-resolution images of the leaves, and analyze leaf lesions and nutritional status. Other robots mount spectrometers near the ground to measure the light intercepted in the lower part of the cotton canopy and evaluate the plants' light-use efficiency (Sun et al., 2017). Of course, ground platforms have their limitations, such as a smaller coverage area than aerial platforms and possible restrictions on movement in muddy fields. However, for experimental fields and breeding nurseries, ground platforms offer the close-up observation capability required for fine phenotypic measurements; they are especially well suited to acquiring cotton traits such as stem thickness, internode length, and the number of buds and bolls.

 

4.3 Integration of multispectral, hyperspectral, and thermal imaging sensors

The powerful functions of high-throughput phenotyping platforms depend on the "firepower support" of various advanced sensors. For the different phenotypic characteristics of cotton, the main sensors currently integrated into platforms include multispectral cameras, hyperspectral imagers, and thermal infrared cameras; each has its own strengths, and together they can capture crop information from multiple angles. A typical multispectral sensor can simultaneously acquire images in the red, green, blue (visible light), red-edge, and near-infrared bands, and the vegetation indices calculated from them (such as NDVI and EVI) are closely related to biomass parameters such as leaf area index and chlorophyll content. For instance, in research on cotton drought resistance, using multispectral UAVs to obtain vegetation indices during the flowering and boll-forming period allows rapid estimation of leaf SPAD values and water content at the population level, providing a basis for screening drought-tolerant varieties. The experiment of Li et al. (2023) used 253 cotton varieties as materials; under normal irrigation and drought stress, cotton canopy images were obtained with a DJI Phantom 4 Multispectral UAV, multiple spectral indices were extracted, and a model was established to estimate leaf nitrogen nutrition (SPAD) and water content. The results show that multispectral indices predict key physiological indicators of cotton leaves with relatively high accuracy. Because of their low cost and simple data processing, multispectral cameras are among the most widely used phenotypic sensors at present and are often used to monitor growth differences, diagnose nitrogen nutrition, and assess boll-opening maturity.
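The NDVI mentioned above is a simple band-ratio index, NDVI = (NIR − Red)/(NIR + Red). The short sketch below computes it per pixel from two co-registered reflectance bands and averages it over a plot mask; the arrays and the plot extent are synthetic stand-ins.

```python
# Simple sketch: per-pixel NDVI from red and near-infrared reflectance bands,
# then a plot-level mean over a hypothetical plot mask.
import numpy as np

red = np.random.uniform(0.03, 0.15, size=(100, 100))   # red reflectance
nir = np.random.uniform(0.30, 0.60, size=(100, 100))   # near-infrared reflectance
ndvi = (nir - red) / (nir + red + 1e-9)                 # avoid division by zero

plot_mask = np.zeros((100, 100), dtype=bool)
plot_mask[20:60, 10:50] = True                          # hypothetical plot extent
print("Plot mean NDVI:", float(ndvi[plot_mask].mean()))
```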

 

Hyperspectral cameras can acquire continuous spectra in hundreds of narrow bands from the visible to the near-infrared and have been likened to "spectral lie detectors". Compared with multispectral cameras, hyperspectral cameras offer richer spectral information and can detect subtle physiological changes in plants. The hyperspectral reflectance curve of cotton leaves contains information on chlorophyll, carotenoids, water content, and other constituents; by analyzing characteristic bands or spectral indices, nutritional and stress conditions can be evaluated more accurately. For instance, a study by Thorp et al. (2024) deployed a field hyperspectral spectrometer to monitor cotton leaf reflectance and, by combining 148 spectral indices with multiple machine learning methods, predicted leaf chlorophyll content, with the best model reaching an R² of 0.88. Hyperspectral remote sensing is also used for the early detection of cotton diseases.

 

Leaf temperature can reveal a great deal about a plant's internal condition. Thermal infrared cameras determine whether crops are short of water and how fast they are transpiring by measuring the radiative temperature of the canopy surface. Generally, when water is scarce or stomata close, leaf temperature rises rapidly, so canopy temperature has become an "indicator light" for judging the degree of water stress and the irrigation requirements of cotton. A now-common approach is to fly drones carrying thermal imagers over the fields, identify areas where the temperature is high and water may be lacking, and then decide whether and how much to irrigate. Some studies have used this method quite maturely: O'Shaughnessy et al. (2023) conducted an AI-driven irrigation experiment combining thermal imaging with Internet of Things (IoT) sensors to optimize the irrigation strategy, saving 20% to 35% of water without any loss of yield.
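One widely used way to turn canopy temperature into a water-stress indicator is the empirical Crop Water Stress Index, CWSI = (Tc − Twet)/(Tdry − Twet), where Tc is the measured canopy temperature and Twet/Tdry are well-watered and non-transpiring reference temperatures. The sketch below is illustrative only; the baselines and the irrigation threshold are assumed values, not calibrated ones, and this is not the cited authors' procedure.

```python
# Hedged sketch of an empirical CWSI computation from thermal-camera readings.
import numpy as np

canopy_temp = np.array([29.5, 31.2, 34.8, 36.1])  # degrees C from thermal imagery
t_wet, t_dry = 28.0, 38.0                          # assumed reference baselines for the day

cwsi = np.clip((canopy_temp - t_wet) / (t_dry - t_wet), 0.0, 1.0)
for t, w in zip(canopy_temp, cwsi):
    flag = "irrigate" if w > 0.5 else "ok"         # illustrative decision threshold
    print(f"Tc={t:.1f} C  CWSI={w:.2f}  -> {flag}")
```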

 

5 Applications of AI-driven Phenotyping Platforms in Data Processing and Analysis

5.1 Image segmentation and extraction of cotton plant structural parameters

Image segmentation is the process of dividing an image into several regions of interest. In cotton phenotypic analysis, a typical task is to separate background soil, weeds, and cotton plants, and to further divide the plants into different organs (leaves, stems, bolls, etc.). Traditional threshold segmentation works in simple backgrounds, but its accuracy is low against complex field backgrounds. Deep learning provides more robust semantic segmentation schemes. Networks such as U-Net and Mask R-CNN can learn the shape and texture features of cotton leaves and boll shells from manually labelled training data and can accurately outline plant contours and organ regions even in complex backgrounds. For instance, some studies have used Mask R-CNN for instance segmentation of cotton boll images during the boll-opening period, separating each open boll from the background and counting them, providing a basis for evaluating the boll-opening rate and harvest timing (Adke et al., 2022). Segmentation results can also be used to calculate plant-architecture parameters, such as canopy coverage (the proportion of green pixels) and projected area, as indicators of vegetation growth. For segmenting and identifying densely planted stands, deep learning segmentation combined with connected-component analysis can automatically count emerging seedlings and seedling spacing in the field, faster and more accurately than manual counting.
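For comparison with the deep segmentation networks discussed above, the sketch below shows the kind of traditional threshold-based baseline they replace: canopy coverage from the excess-green index (ExG = 2g − r − b on chromatic coordinates) with a fixed threshold. The image and threshold are illustrative stand-ins.

```python
# Sketch of a traditional (non-deep-learning) vegetation segmentation baseline.
import numpy as np

rgb = np.random.randint(0, 256, size=(200, 200, 3)).astype(float)  # stand-in image
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
total = r + g + b + 1e-9
exg = 2 * (g / total) - (r / total) - (b / total)   # chromatic excess-green index

vegetation = exg > 0.05                              # illustrative threshold
coverage = vegetation.mean()                         # fraction of "green" pixels
print(f"Canopy coverage: {coverage:.1%}")
```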

 

After image segmentation is completed and a clean cotton plant image is obtained, structural parameters can be extracted. One of the most common is plant height. In the past, plant height was measured with a ruler from the ground to the top of the plant, which was slow and error-prone. Nowadays, three-dimensional reconstruction makes it possible to measure plant height and structure without touching the plant. For instance, after processing low-altitude UAV images with the Structure from Motion (SfM) algorithm, high-density point clouds of the experimental field can be generated; by subtracting the digital elevation model (DEM) from the digital surface model (DSM), the average plant height of each plot can be calculated. Another common structural parameter is the leaf area index (LAI), the leaf area per unit ground area, which reflects the growth condition of cotton and its ability to intercept light. LAI used to be determined by direct measurement or with instruments (such as the LAI-2200); now it can also be estimated by combining AI platforms with remote sensing. In the study by Wu et al. (2022), a point cloud model generated from drone imagery was used not only to monitor changes in cotton plant height but also to estimate LAI after defoliation treatment. Three days after the defoliant was applied, the R² between point-cloud-estimated and measured LAI reached 0.872, indicating that with high-resolution images combined with three-dimensional information, changes in leaf area can be tracked dynamically and used to judge defoliation effects or assess plant senescence. In addition to plant height and LAI, the AI platform can also extract structural information about cotton that is difficult to measure by hand.
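The DSM-minus-DEM step described above can be summarized in a few lines: subtract the two rasters to obtain a canopy height model (CHM) and take a percentile difference within each plot to suppress noise, in the spirit of the percentile approach used by Ye et al. (2023). The rasters and plot extent below are synthetic numpy stand-ins, not real survey data.

```python
# Minimal sketch of per-plot plant height from a canopy height model (CHM = DSM - DEM).
import numpy as np

dsm = np.random.uniform(100.0, 101.2, size=(500, 500))  # digital surface model (m)
dem = np.full((500, 500), 100.0)                         # bare-ground elevation (m)
chm = dsm - dem                                          # canopy height model

plot_bounds = {"plot_001": (slice(50, 150), slice(50, 110))}  # hypothetical plot extent
for plot_id, (rows, cols) in plot_bounds.items():
    heights = chm[rows, cols]
    # 95th percentile minus a low percentile approximates plot-level plant height.
    plant_height = np.percentile(heights, 95) - np.percentile(heights, 1)
    print(plot_id, f"estimated plant height: {plant_height:.2f} m")
```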

 

5.2 Time-series data analysis and growth dynamics monitoring

A direct application of time-series data analysis is the quantification of growth rates. Traditionally, we have gained only a rough picture of the earliness or lateness of a variety by measuring traits at a few stages such as the seedling, budding, flowering, and boll-opening stages. Now, drones and ground sensors make it possible to monitor daily or even more frequently. For instance, Ye et al. (2023) used drones to measure the plant height of 320 upland cotton accessions at three locations as a time series, obtaining plant height values for each accession at multiple time points. By conducting principal component analysis (PCA) on these time-series data, they reduced the dimensionality of the plant height growth curves and extracted two main components: the first represents the average plant height level, and the second reflects differences in growth rate. The results classified the materials into different types: some were tall and fast-growing overall, some short and slow-growing, and some of medium height but fast-growing early and slower later (Figure 1). This analysis reveals information that traditional endpoint measurements cannot provide, namely the growth dynamic patterns of different materials. For breeding, this helps to select combinations that grow fast at the seedling stage and steadily after flowering, and can also reveal materials with "late-bloomer" advantages (fast late-stage growth that may give high yields in specific environments).
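The sketch below illustrates the general form of this analysis (not the cited authors' code): PCA applied to an accessions-by-dates plant height matrix, where the first component tends to track overall height and the second growth-rate differences. The growth curves are simulated solely to make the example runnable.

```python
# Sketch: PCA over multi-date plant height measurements (simulated accessions x dates matrix).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_accessions, n_dates = 320, 10
base = rng.uniform(40, 90, size=(n_accessions, 1))          # final height level
rate = rng.uniform(0.5, 1.5, size=(n_accessions, 1))        # relative growth speed
t = np.linspace(0, 1, n_dates)
heights = base * (1 - np.exp(-3 * rate * t))                 # logistic-like growth curves
heights += rng.normal(0, 1.0, size=heights.shape)            # measurement noise

pca = PCA(n_components=2)
scores = pca.fit_transform(heights)
print("Explained variance ratios:", pca.explained_variance_ratio_.round(3))
print("First accession scores (PC1, PC2):", scores[0].round(2))
```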

 

  

Figure 1 Flow chart of this study (Adapted from Ye et al., 2023)

Image caption: (a) Ground control points (GCPs) marked by red circles were evenly arranged in the trial fields. Each accession had about 10-20 plants and was grown in a plot of 3 m × 0.6 m in size. (b) The unmanned aerial vehicle (UAV) remote-sensing platform (DJI Phantom 4 RTK) was applied in this study to obtain visible images. (c) Ground-truth plant height (PH) of samples and coordinates of GCPs were measured by ruler and RTK, respectively. (d) Image processing and PH extraction process. The difference between the first and the 95th percentiles was used to extract PH from the on-season digital surface model (DSM) of the plot. (e) UAV-based PH was used for GWAS and the associated candidate genes were identified (Adapted from Ye et al., 2023)

 

Through time-series monitoring, we can identify which stages in cotton growth are most critical and how much these stages affect yield. For instance, if analysis of NDVI or LAI time series from multiple locations over several consecutive years reveals that a certain stage (such as early flowering) has the strongest relationship with yield, it indicates that this stage is of great importance, and its growth conditions deserve special attention in breeding and field management. In production it is often said that the boll-setting period (that is, the flowering and boll-forming period) is particularly important, because this stage determines how many bolls the cotton can form and how large each boll can grow. We can now support this with data: if the time-series data show that in the two weeks after flowering the LAI of some plots rises rapidly and their final yield is also high, this indicates that the increase in leaf area at this stage strongly contributes to yield. Based on this pattern, water and fertilizer inputs can be increased during this window to boost output. This is like installing a "warning system" for the growing season: it can tell us whether the cotton is growing well, and if growth falls short of expectations, management can be adjusted in time.
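A very simple version of the stage-importance analysis described above is to correlate a remote-sensing trait at each growth stage with final plot yield. The sketch below does this on simulated data; the column names and numbers are hypothetical.

```python
# Sketch: correlating stage-wise NDVI with final plot yield to rank growth stages.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "ndvi_seedling":  rng.uniform(0.2, 0.5, n),
    "ndvi_budding":   rng.uniform(0.4, 0.7, n),
    "ndvi_flowering": rng.uniform(0.6, 0.9, n),
})
# Synthetic yield, made to depend most strongly on flowering-stage NDVI.
df["yield"] = 1500 * df["ndvi_flowering"] + 300 * df["ndvi_budding"] + rng.normal(0, 50, n)

stage_cols = [c for c in df.columns if c.startswith("ndvi_")]
print(df[stage_cols + ["yield"]].corr()["yield"][stage_cols].sort_values(ascending=False))
```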

 

5.3 Association analysis of phenotype, genotype, and environmental factors

In recent years, with the development of high-throughput genotyping technology, we can obtain whole-genome marker information (such as SNPs) for cotton materials, while high-throughput phenotyping platforms provide vast amounts of trait data. The combination of the two has given rise to a new paradigm integrating "phenomics" and "genomics". For genotype-phenotype association analysis, genome-wide association studies (GWAS) and quantitative trait locus (QTL) mapping are the main approaches. High-quality phenotypic data directly determine the success or failure of association analysis. In the past, because manually collected phenotypic data had large errors and few replicates, often only a few major genes could be detected. Now, AI phenotyping platforms provide more refined, multi-dimensional phenotypic indicators for association analysis, thereby improving detection power.
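At its core, a GWAS scan tests the association between the phenotype and each marker in turn. The sketch below is a highly simplified illustration of that idea on simulated data; real GWAS additionally corrects for population structure and kinship (for example with mixed linear models), which is omitted here.

```python
# Highly simplified single-marker association scan (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_lines, n_snps = 300, 1000
genotypes = rng.integers(0, 3, size=(n_lines, n_snps)).astype(float)  # 0/1/2 allele coding
causal = 10                                                            # hypothetical causal SNP
phenotype = 2.0 * genotypes[:, causal] + rng.normal(0, 1.0, n_lines)   # e.g. plant height

# Regress the phenotype on each SNP and record the p-value.
p_values = np.array([
    stats.linregress(genotypes[:, j], phenotype).pvalue for j in range(n_snps)
])
print("Most significant SNP index:", int(p_values.argmin()),
      "p =", float(p_values.min()))
```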

 

On the other hand, phenotype-environment interaction analysis is crucial for understanding variety adaptability and optimizing agricultural management. The phenotypic performance of cotton varieties differs across environments, that is, there is a "G×E interaction". Traditional breeding evaluates the broad and specific adaptability of varieties through multi-location trials, but it is limited by a small set of manually measured indicators, making it difficult to analyze interaction mechanisms in depth. High-throughput phenotyping can provide rich trait data from each location, supplying the material needed to analyze interactions. For instance, we can compare the differences in time-series growth curves of the same set of materials in trials in southern Xinjiang and in central and southern Hebei, identify types that yield well in southern Xinjiang but poorly in the north, and then analyze the reasons in combination with environmental data (temperature, light, etc.). If one variety maintains a high LAI and photosynthetic rate in hot regions while the leaf area of another drops rapidly under high temperature, this reveals a difference in heat tolerance between the two. Such analyses can be carried out by combining statistical models (such as the AMMI model and GGE biplots) with multi-location phenotypic data, or by introducing machine learning to predict variety performance from environmental variables and then inferring the sources of interaction effects. For instance, Xu et al. (2017) used a genotype-by-trait biplot analysis to optimize registration criteria for cotton varieties, which was in essence an analysis of trait-environment interactions aimed at identifying trait indicators that both reflect genetic differences and remain robust. High-throughput phenotypic data will make such analyses more accurate.
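The AMMI model mentioned above removes genotype and environment main effects from a genotype-by-environment table and then applies a principal component (SVD) decomposition to the interaction residuals. The sketch below shows that computation on a simulated yield table; it is a conceptual illustration, not a full AMMI implementation with significance testing.

```python
# Sketch of an AMMI-style decomposition of a simulated genotype x environment yield table.
import numpy as np

rng = np.random.default_rng(4)
n_geno, n_env = 20, 6
yield_table = (rng.normal(3000, 200, size=(n_geno, 1))      # genotype effects
               + rng.normal(0, 300, size=(1, n_env))         # environment effects
               + rng.normal(0, 100, size=(n_geno, n_env)))   # interaction + noise

grand = yield_table.mean()
geno_eff = yield_table.mean(axis=1, keepdims=True) - grand
env_eff = yield_table.mean(axis=0, keepdims=True) - grand
interaction = yield_table - grand - geno_eff - env_eff        # doubly centred residuals

u, s, vt = np.linalg.svd(interaction, full_matrices=False)    # PCA of the interaction
ipca1_share = s[0] ** 2 / (s ** 2).sum()
print(f"IPCA1 explains {ipca1_share:.1%} of the G x E interaction")
```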

 

6 Case Studies: Practical Applications of AI-Driven Cotton Phenotyping Platforms

6.1 Large-scale phenotyping practices in Chinese cotton regions

China is a major country in cotton cultivation and scientific research. In recent years, China has made many attempts in phenotypic monitoring of large-scale cotton fields. A typical example is the phenotypic identification of a large number of cotton germplasm resources in major production areas such as Xinjiang. The ecological environment in Xinjiang is rather unique, and it is also rich in cotton resources. However, to find germplasm that is both stress-resistant and high-yielding from so many materials, it would be very slow and arduous to use traditional methods. To enhance efficiency, researchers have introduced a high-throughput phenotypic platform to conduct large-scale and precise field determinations on hundreds of cotton germplasms. For instance, Professor Zhang Xianlong's team from Huazhong Agricultural University collaborated with research institutions in Xinjiang to conduct high-throughput phenotypic monitoring using drones. They screened over 1 800 cotton hybrid offspring materials in terms of plant height, leaf traits, and yield. Through high-frequency remote sensing monitoring, they selected 53 cotton materials with particularly strong drought resistance in the fields. Even in the case of severe drought, when the irrigation water volume was reduced by 50%, the output of these materials did not decline. Next, the research team combined molecular marker analysis and applied these excellent materials in breeding, eventually developing new water-saving and drought-resistant varieties like "Jinken 1161". This case demonstrates that high-throughput phenotypic technology has played a significant role in the screening of cotton resources and stress-resistant breeding in China, and has also greatly accelerated the breeding speed.

 

High-throughput phenotypic monitoring has also been carried out at the Nanfan breeding base in Hainan. The off-season breeding and generation-advancement trials there involve numerous plots, which poses a great challenge for manual management. The Chinese Academy of Agricultural Sciences has established the National Nanfan Crop Phenotyping Center in Sanya, equipped with automated phenotyping carts and unmanned aerial vehicle systems, to conduct around-the-clock monitoring of crops such as cotton and rice in the experimental fields (Zhang et al., 2024). The center's ground phenotyping vehicle reportedly tours multiple experimental fields automatically every day, monitoring cotton plant height and growth progress and transmitting the data to a remote server in real time. Breeders can view the growth curves and on-site images of each material on their phones or computers, follow trial progress, and detect abnormal situations promptly. This remote digital monitoring model proved especially valuable during the epidemic, enabling breeders who could not visit the site to "select seedlings remotely".

 

6.2 AI-driven field trials and yield prediction applications in U.S. cotton regions

In the United States, several cotton-growing states have established high-throughput field phenotyping facilities. The best known is the USDA-ARS crop field high-throughput phenotyping facility located in Maricopa, Arizona. This site is equipped with a large Field Scanalyzer automated phenotyping gantry (mainly used for wheat), but for cotton, experiments and monitoring are conducted mainly with a combination of unmanned aerial vehicles and ground platforms. Thorp et al. (2024) reported a study in the Arizona cotton region that assessed cotton leaf chlorophyll content using ground-based hyperspectral measurements and machine learning. They conducted a four-year field experiment in Maricopa, repeatedly measuring the leaf reflectance of different cotton varieties with a hyperspectral spectrometer mounted on a pushcart, and trained models against chlorophyll content analyzed in the laboratory. Comparing 148 spectral indices and 14 machine learning algorithms, they found that ensemble learning (such as random forests and gradient boosting) combined with red-edge band indicators predicted chlorophyll best, with an R² of up to 0.88. However, they also found that the models generalized poorly across years: when trained on 2019-2020 data and tested on 2021-2022 data, prediction performance dropped sharply (R² of only 0.46). This suggests that calibration for environmental differences is required (Figure 2).
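The cross-year generalization test described above follows a simple pattern: train on one pair of years, evaluate on another. The sketch below reproduces only that pattern on synthetic data (not the authors' data or code); the "year effect" is an assumed shift in the simulated spectral indices.

```python
# Sketch of a cross-year generalization check with gradient boosting (synthetic data).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(5)

def make_year(n, shift):
    # Hypothetical spectral indices; 'shift' mimics a systematic year effect.
    X = rng.uniform(0.2, 0.9, size=(n, 5)) + shift
    chl = 30 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 2, n)  # chlorophyll proxy
    return X, chl

X_1920, y_1920 = make_year(400, shift=0.00)   # stands in for the 2019-2020 training years
X_2122, y_2122 = make_year(400, shift=0.08)   # stands in for the 2021-2022 test years

model = GradientBoostingRegressor(random_state=0).fit(X_1920, y_1920)
print(f"Within-years R2: {r2_score(y_1920, model.predict(X_1920)):.2f}")
print(f"Cross-years  R2: {r2_score(y_2122, model.predict(X_2122)):.2f}")
```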

 

 

Figure 2 Summary of measured chlorophyll (Chl) and spectral reflectance data collected from cotton leaves during field studies at Maricopa, Arizona, USA, including (A) area-basis Chl, (B) mass-basis Chl, and (C) the minimum, median, and maximum of spectral reflectance data from the 2019-2020 experiment and (D) area-basis Chl, (E) mass-basis Chl, and (F) the minimum, median, and maximum of spectral reflectance data from the 2021-2022 experiment (Adapted from Thorp et al., 2024)

 

In terms of yield prediction and precision agriculture, many cases of AI application have emerged in U.S. cotton-growing regions. Feng et al. (2022) comprehensively considered soil, meteorological, and UAV remote sensing information and used deep learning to predict cotton yield. They set up field trials in Missouri to measure yields under different treatments and to collect soil texture, seasonal meteorological data, and mid-season drone imagery. By fusing these heterogeneous data in a CNN model for training, they achieved accurate prediction of trial yields. The results show that the prediction accuracy of the fusion model is significantly higher than that obtained using any single data source, demonstrating the power of multi-source AI models for cotton yield prediction.

 

6.3 Performance and challenges of AI phenotyping platforms in multi-location trials

Multi-location trials are not new; they have long been an important method for evaluating the stability and adaptability of cotton varieties. What has changed is the support of AI. On the one hand, AI-driven phenotyping platforms have indeed significantly improved trial efficiency and data quality; on the other hand, not all problems have been solved, and new challenges have emerged. Take data acquisition as an example. In the past, in national cotton regional trials, many sites could record only a few indicators: yield, quality, and a few basic agronomic traits. Now some sites have introduced UAV monitoring, which is not only more efficient but also yields more detailed data, such as vegetation indices and plant height at different time points across the whole growth period. This additional information helps explain where yield differences among varieties come from. For instance, a regional trial in the North China cotton region found an interesting phenomenon: the NDVI of two high-yielding new varieties at the end of flowering was significantly higher than that of the control variety, indicating that their leaves still functioned well late in the season and did not decline prematurely. The final results confirmed this: these two materials had both heavy individual bolls and large boll numbers, whereas in the low-yielding materials NDVI dropped rapidly after flowering and senescence set in noticeably earlier (Gu et al., 2024).

 

However, practice has also shown that multi-location trials bring challenges for operating AI phenotyping platforms. The first issue is data standardization: environmental backgrounds, lighting conditions, and operating procedures differ among test sites, which can introduce systematic biases into the remote sensing data. The second is platform adaptability: field conditions vary greatly among sites, imposing different requirements on UAV flights and robot operation. The third is model generalization: when machine learning models are applied for prediction in multi-location trials, a model trained in one environment often performs poorly in another. Fourth, the data management and analysis burden generated by large-scale multi-location trials cannot be ignored: high-frequency remote sensing data from dozens or even hundreds of sites form a huge database, and effectively storing and managing the metadata (such as variety numbers and site information) and making it available to researchers requires a centralized, unified database platform.
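One simple, commonly used step toward the data standardization mentioned above is to normalize a remote-sensing trait within each site before comparing materials across sites. The sketch below z-scores NDVI by site on simulated data; the site names and offsets are illustrative, and this is only one of many possible harmonization strategies.

```python
# Sketch: per-site z-scoring of a remote-sensing trait to reduce systematic site offsets.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
df = pd.DataFrame({
    "site": np.repeat(["Xinjiang", "Hebei", "Hunan"], 50),
    "ndvi": np.concatenate([
        rng.normal(0.80, 0.03, 50),   # illustrative site-specific offsets
        rng.normal(0.72, 0.03, 50),
        rng.normal(0.76, 0.03, 50),
    ]),
})
df["ndvi_z"] = df.groupby("site")["ndvi"].transform(lambda s: (s - s.mean()) / s.std())
print(df.groupby("site")["ndvi_z"].agg(["mean", "std"]).round(2))
```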

 

7 Future Prospects

Judging from the current frontier of technological development, AI-driven cotton phenotyping platforms are evolving toward greater intelligence, efficiency and deep integration, which will provide strong support for the development of smart agriculture. Platform hardware will become cheaper, more automated and more ubiquitous. At present, high-throughput phenotyping equipment is relatively expensive and has a high usage threshold, which limits its large-scale adoption in production. One future trend is cost reduction: with the mass production of drones and sensors, prices will continue to fall, and the hardware investment required for AI phenotyping will no longer be prohibitive. Moreover, the emergence of more open-source and low-cost components (such as open-source agricultural robot projects) will give rise to affordable versions of these platforms. The development of new low-cost intelligent equipment for in-field phenotypic acquisition and analysis will be a key focus.

 

AI analysis algorithms will move toward higher levels of intelligence and integrated decision-making. At present, most AI models operate independently on specific traits or tasks, such as yield prediction or disease detection. The trend is to build multi-task joint models or digital twin systems that provide comprehensive simulation of, and decision support for, crop growth. Deep learning models will not only tell us "what state the plant is in", but also go further and answer "what measures need to be taken". AI phenotyping platforms will be deeply integrated with genomics, breeding information systems and agricultural machinery operation systems, giving rise to a new intelligent agricultural ecosystem. In breeding, the synergy of phenomics and genomics will accelerate "intelligent breeding": algorithms trained on phenotypic big data can help breeders eliminate inferior materials at an early stage and predict performance in environments not yet tested, thereby improving selection efficiency. From a macro perspective, the widespread application of AI-driven phenotyping platforms will bring social and economic benefits as well as changes in research paradigms. For cotton, an important economic crop, the digitalization and intelligence of production will increase yield per unit area, reduce resource consumption and environmental pollution, and promote sustainable agricultural development.
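The multi-task idea mentioned above can be sketched minimally in Python (PyTorch): one shared image encoder with separate heads for yield regression and disease classification, trained with a joint loss. The task pairing, layer sizes and class count are illustrative assumptions, not a published design.

# Hedged sketch of a multi-task phenotyping model: shared encoder, two task heads.
import torch
import torch.nn as nn

class MultiTaskPhenotypeNet(nn.Module):
    def __init__(self, n_disease_classes: int = 3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),     # shared 32-dim embedding
        )
        self.yield_head = nn.Linear(32, 1)                     # regression head
        self.disease_head = nn.Linear(32, n_disease_classes)   # classification head

    def forward(self, x):
        z = self.encoder(x)
        return self.yield_head(z), self.disease_head(z)

# Joint loss: weighted sum of regression and classification objectives
model = MultiTaskPhenotypeNet()
images = torch.randn(4, 3, 64, 64)
yield_true, disease_true = torch.randn(4, 1), torch.randint(0, 3, (4,))
yield_pred, disease_logits = model(images)
loss = nn.functional.mse_loss(yield_pred, yield_true) + \
       nn.functional.cross_entropy(disease_logits, disease_true)
loss.backward()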

 

Of course, we also need to stay clear-headed. Intelligent technology is not omnipotent, and in a field as complex as agriculture, AI systems will sometimes make mistakes or even fail. Human-machine collaboration will therefore remain necessary: AI handles large volumes of data, while human experts make decisions and deal with unexpected situations. For example, AI may first offer management suggestions, but whether to implement them ultimately depends on the agronomist's judgment of experience and risk. This human-machine combination is very likely to become the mainstream approach in future smart agriculture. Even as AI matures and people gradually place more trust in its judgments, necessary human monitoring and intervention cannot be omitted.

 

Acknowledgments

We are grateful to Dr. W. Zhang for his assistance with the careful reading of the manuscript and for helpful discussions during the course of this work.

 

Conflict of Interest Disclosure

The authors affirm that this research was conducted without any commercial or financial relationships that could be construed as a potential conflict of interest.

 

References

Adams B., Ritchie G., and Rajan N., 2020, Cotton phenotyping and physiology monitoring with a proximal remote sensing system, Crop Science, 61(2): 1317-1327.

https://doi.org/10.1002/csc2.20434

 

Adke S., Li C., Rasheed K., and Maier F., 2022, Supervised and weakly supervised deep learning for segmentation and counting of cotton bolls using proximal imagery, Sensors, 22(10): 3688.

https://doi.org/10.3390/s22103688

 

Ampatzidis Y., and Partel V., 2020, UAV- and cloud-based application for high throughput phenotyping utilizing deep learning, In: 2020 ASABE annual international virtual meeting, American Society of Agricultural and Biological Engineers, 2020: 1.

https://doi.org/10.13031/aim.202000775

 

Beegum S., Hassan M., Ramamoorthy P., Bheemanahalli R., Reddy K., Reddy V., and Reddy K., 2024, Hyperspectral reflectance-based high throughput phenotyping to assess water-use efficiency in cotton, Agriculture, 14(7): 1054.

https://doi.org/10.3390/agriculture14071054

 

Deng Y.F., Xiao S.P., Yang X., Yang S.Q., Liu X.W., Ke X.S., and Wang T., 2020, Analysis of main agronomic traits of new early-maturing cotton lines and selection of elite varieties, Subtropical Agriculture Research, 3: 145-152.

https://doi.org/10.13321/j.cnki.subtrop.agric.res.2020.03.001

 

Feng A., Zhou J., Vories E., and Sudduth K., 2022, Quantifying the effects of soil texture and weather on cotton development and yield using UAV imagery, Precision Agriculture, 23(4): 1248-1275.

https://doi.org/10.1007/s11119-022-09883-6

 

Feng M.C., Su Y., Lin T., Yu X., Song Y., and Jin X.L., 2025, High throughput cotton yield estimation based on multi-source remote sensing data from unmanned aerial vehicles and machine learning, Transactions of the Chinese Society of Agricultural Machinery, 56(3): 169-179.

https://doi.org/10.6041/j.issn.1000-1298.2025.03.017

 

Gu H., Mills C., Ritchie G.L., and Guo W., 2024, Water stress assessment of cotton cultivars using unmanned aerial system images, Remote Sensing, 16(14): 2609.

https://doi.org/10.3390/rs16142609

 

Jiang Y., Li C., Robertson J.S., Sun S., Xu R., and Paterson A., 2018, GPhenoVision: a ground mobile system with multi-modal imaging for field-based high throughput phenotyping of cotton, Scientific Reports, 8(1): 1213.

https://doi.org/10.1038/s41598-018-19142-2

 

Li Y.K., Guo X.Y., Zhang Y., Gu S.H., Zhang Y.J., and Wu S., 2023, Research progress on cotton phenotypic techniques, Jiangsu Agricultural Sciences, 11: 27-36.

 

Li Z., 2024, A review of cotton cultivation techniques for high yield, Cotton Genomics and Genetics, 15(6): 284-293.

https://doi.org/10.5376/cgg.2024.15.0027

 

Ma Y.R., Lü X., Qi Y.Q., Zhang Z., Yi X., Chen X.Y., Yan T.Y., and Hou T.Y., 2021, Estimation of the defoliation rate of cotton based on unmanned aerial vehicle digital images, Cotton Science, 4: 347-359.

https://doi.org/10.11963/cs20210003

 

O'Shaughnessy S., Colaizzi P., and Bednarz C., 2023, Sensor feedback system enables automated deficit irrigation scheduling for cotton, Frontiers in Plant Science, 14: 1149424.

https://doi.org/10.3389/fpls.2023.1149424

 

Psiroukis V., Papadopoulos G., Kasimati A., Tsoulias N., and Fountas S., 2023, Cotton growth modelling using UAS-derived DSM and RGB imagery, Remote Sensing, 15(5): 1214.

https://doi.org/10.3390/rs15051214

 

Sun S., Li C., and Paterson A., 2017, In-field high-throughput phenotyping of cotton plant height using LiDAR, Remote Sensing, 9(4): 377.

https://doi.org/10.3390/rs9040377

 

Thorp K., Thompson A.L., and Herritt M., 2024, Phenotyping cotton leaf chlorophyll via in situ hyperspectral reflectance sensing, spectral vegetation indices, and machine learning, Frontiers in Plant Science, 15: 1495593.

https://doi.org/10.3389/fpls.2024.1495593

 

Wang H.H., Zhang Z., Kang X., Lin J., Yin C., Ma L., Huang C., and Lü X., 2022, Cotton planting area extraction and yield prediction based on Sentinel-2A, Transactions of the Chinese Society of Agricultural Engineering, 38(10): 205-214.

 

Wu W., Wen W., Zhang Y., Wang Y., and Liu J., 2022, Estimation of cotton canopy parameters based on unmanned aerial vehicle (UAV) oblique photography, Plant Methods, 18(1): 129.

https://doi.org/10.1186/s13007-022-00966-z

 

Xu N., Fok M., Li J., Yang X., and Yan W., 2017, Optimization of cotton variety registration criteria aided with a genotype-by-trait biplot analysis, Scientific Reports, 7(1): 17237.

https://doi.org/10.1038/s41598-017-17631-4

 

Yang Y.Z., Xia W.K., Chu H.Q., Su W.H., Wang R.F., and Wang H.H., 2025, A comprehensive review of deep learning applications in cotton industry: from field monitoring to smart processing, Plants, 14(10): 1481.

https://doi.org/10.3390/plants14101481

 

Ye Y., Wang P., Zhang M., Abbas M., Zhang J., Liang C., Wang Y., Wei Y., Meng Z., and Zhang R., 2023, UAV-based time series phenotyping reveals the genetic basis of plant height in upland cotton, The Plant Journal, 115(4): 937-951.

https://doi.org/10.1111/tpj.16272

 

Zhang J.H., Yao Q., Zhou G.M., Wu W.D., Xiu X.J., and Wang J., 2024, Intelligent identification of crop agronomic traits and morphological structure phenotypes: a review, Smart Agriculture, 6(2): 14-27.

https://doi.org/10.12133/j.smartag.SA202401015

 

Zhang Q., and Wang Y., 2024, AI in biology: transforming genomic research with machine learning, Computational Molecular Biology, 14(3): 106-114.

https://doi.org/10.5376/cmb.2024.14.0013

 

Zhao C.J., 2019, Big data of plant phenomics and its research progress, Journal of Agricultural Big Data, 1(2): 5-18.

https://doi.org/10.19788/j.issn.2096-6369.190201

 
